• The paper "Knowledge Graph Embedding by Normalizing Flows," by Changyi Xiao, Xiangnan He, and Yixin Cao, presents a novel approach to knowledge graph embedding (KGE) that combines ideas from group theory and normalizing flows. The authors stress the importance of choosing an appropriate representation space for KGE, such as point-wise Euclidean space or complex vector space. The central contribution is a unified perspective that introduces uncertainty into KGE while incorporating existing models and preserving computational efficiency and expressiveness. Concretely, entities and relations are embedded as elements of a symmetric group, i.e., as permutations of a set: different permutations capture different properties of embeddings, and the group operation (composition of permutations) remains computationally cheap. To model uncertainty, the set being permuted is taken to be a set of random variables, so that simple random variables are transformed into more complex ones; this is precisely the mechanism of normalizing flows and is what gives the model its expressiveness. The scoring function, referred to as NFE, is defined via the similarity of two normalizing flows, and the authors show that the model can learn logical rules. Experiments validate the benefit of incorporating uncertainty into KGE, and the associated code is publicly available, encouraging further exploration and application of the approach.
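  As a rough, hypothetical illustration (not the authors' NFE implementation), the sketch below shows the core idea in miniature: relations and entities are represented as permutations (elements of a symmetric group) acting on a set of simple random variables, and a triple (h, r, t) is scored by how similar the transformed samples are. All names (`random_permutation`, `compose`, `score`, `DIM`, `N_SAMPLES`) and the distance-based score are assumptions made for this toy example only.

  ```python
  # Toy sketch: permutation-group embeddings acting on random variables.
  # This is NOT the paper's NFE model, only an illustration of the idea.
  import numpy as np

  rng = np.random.default_rng(0)
  DIM = 8          # size of the set being permuted (hypothetical choice)
  N_SAMPLES = 512  # Monte Carlo samples of the base random variables


  def random_permutation(dim, rng):
      """Sample a permutation of {0, ..., dim-1}, i.e., an element of S_dim."""
      return rng.permutation(dim)


  def compose(p, q):
      """Group operation in the symmetric group: (p o q)[i] = p[q[i]]."""
      return p[q]


  def apply_to_samples(perm, samples):
      """Act on a batch of sampled base variables by permuting their coordinates."""
      return samples[:, perm]


  def score(head_perm, rel_perm, tail_perm, base_samples):
      """Toy score for (h, r, t): negative mean distance between the
      (relation o head)-transformed samples and the tail-transformed samples."""
      lhs = apply_to_samples(compose(rel_perm, head_perm), base_samples)
      rhs = apply_to_samples(tail_perm, base_samples)
      return -np.mean(np.linalg.norm(lhs - rhs, axis=1))


  # "Simple" base random variables: i.i.d. standard Gaussians.
  base = rng.normal(size=(N_SAMPLES, DIM))

  h = random_permutation(DIM, rng)
  r = random_permutation(DIM, rng)
  t = compose(r, h)  # a tail consistent with (h, r) scores highest (distance 0)

  print("consistent triple:", score(h, r, t, base))
  print("random triple:    ", score(h, r, random_permutation(DIM, rng), base))
  ```

  In this sketch the consistent triple attains the maximal score of 0, while a random tail does not; the actual paper defines its similarity between two normalizing flows rather than this ad hoc sample distance.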